Relative Divergence Measures and Information Inequalities
Abstract
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's [17] relative information, Jeffreys' [16] J-divergence, and the information radius, or Jensen difference divergence measure, due to Sibson [23], which has also found applications in the work of Burbea and Rao [3, 4]. Taneja [25] studied another kind of divergence measure, based on the arithmetic and geometric means. The J-divergence, the Jensen-Shannon divergence, and the arithmetic-geometric divergence bear a good relationship with one another. There are, however, further measures arising from the J-divergence, JS-divergence, and AG-divergence; here we call these relative divergence measures, or non-symmetric divergence measures. Our aim is to obtain bounds on the symmetric and non-symmetric divergence measures in terms of the relative information of type s, using properties of Csiszár's f-divergence.
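For orientation, the standard discrete forms of these measures are sketched below in common notation; this is an assumption about the intended definitions rather than a quotation from the cited papers. Here P = (p_1, ..., p_n) and Q = (q_1, ..., q_n) denote strictly positive probability distributions.

```latex
\begin{align*}
K(P\|Q) &= \sum_{i=1}^{n} p_i \ln\frac{p_i}{q_i}
  && \text{(Kullback--Leibler relative information)}\\
J(P\|Q) &= K(P\|Q) + K(Q\|P)
        = \sum_{i=1}^{n} (p_i - q_i)\ln\frac{p_i}{q_i}
  && \text{(Jeffreys' $J$-divergence)}\\
I(P\|Q) &= \frac{1}{2}\left[ K\!\left(P \,\middle\|\, \tfrac{P+Q}{2}\right)
        + K\!\left(Q \,\middle\|\, \tfrac{P+Q}{2}\right) \right]
  && \text{(Jensen--Shannon divergence)}\\
T(P\|Q) &= \sum_{i=1}^{n} \frac{p_i + q_i}{2}\,
           \ln\frac{p_i + q_i}{2\sqrt{p_i q_i}}
  && \text{(arithmetic--geometric mean divergence)}
\end{align*}
```

Under these definitions, the relationship alluded to above takes the compact form $\tfrac{1}{4}J(P\|Q) = I(P\|Q) + T(P\|Q)$, which can be checked by expanding the logarithms termwise.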
Similar Articles
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, this paper examines measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to distance between probability d...
Refinement Inequalities among Symmetric Divergence Measures
There are three classical divergence measures in the literature on information theory and statistics, namely, Jeffreys-Kullback-Leibler's J-divergence, Sibson-Burbea-Rao's Jensen-Shannon divergence, and Taneja's arithmetic-geometric mean divergence. These bear an interesting relationship with each other and are based on logarithmic expressions. The divergence measures like Hellinger discrimination...
Relative information of type s, Csiszár's f-divergence, and information inequalities
In recent years, Dragomir has contributed a substantial body of work providing different kinds of bounds on distance, information, and divergence measures. In this paper, we unify some of his results using the relative information of type s and relate it to Csiszár's f-divergence.
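As a concrete illustration of how the type-s measure fits into Csiszár's framework, here is a minimal Python sketch. The function names are ours, and the generator f_s(t) = (t^s - 1)/(s(s - 1)) is an assumed normalization for the relative information of type s; the papers cited here may normalize differently.

```python
import numpy as np

def f_divergence(p, q, f):
    """Csiszar's f-divergence C_f(P||Q) = sum_i q_i * f(p_i / q_i),
    for strictly positive discrete distributions p and q and a
    convex generator f with f(1) = 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Generator for the Kullback-Leibler relative information: f(t) = t ln t.
kl_generator = lambda t: t * np.log(t)

# Assumed generator for the relative information of type s (s != 0, 1):
# f_s(t) = (t**s - 1) / (s * (s - 1)); letting s -> 1 recovers KL.
def type_s_generator(s):
    return lambda t: (t ** s - 1.0) / (s * (s - 1.0))

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl_generator))         # K(P||Q)
print(f_divergence(p, q, type_s_generator(2)))  # type-s measure with s = 2
```

With f(t) = t ln t the sum reduces to the Kullback-Leibler relative information, so the type-s family interpolates through KL as a special limiting case.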
Some Inequalities Among New Divergence Measures
There exist three classical divergence measures in the literature on information theory and statistics, namely, the Jeffreys-Kullback-Leibler J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [8] arithmetic-geometric mean divergence. These three measures bear an interesting relationship with each other and are based on logarithmic expressions. The divergence m...
Some inequalities for information divergence and related measures of discrimination
Inequalities which connect information divergence with other measures of discrimination or distance between probability distributions are used in information theory and its applications to mathematical statistics, ergodic theory and other scientific fields. We suggest new inequalities of this type, often based on underlying identities. As a consequence we obtain certain improvements of the well...